A nonlinear least squares quasi-Newton strategy for LP-SVR hyper-parameters selection
Authors
Abstract
This paper studies the problem of hyper-parameter selection for a linear programming-based support vector machine for regression (LP-SVR). The proposed model is a generalized method that minimizes a nonlinear least-squares problem using a globalization strategy, inexact computation of first-order information, and an existing analytical method for estimating the initial point in the hyper-parameter space. The minimization problem consists of finding the set of hyper-parameters that minimizes a given generalization-error function; in particular, this research explores two-class, multi-class, and regression problems. Simulation results on standard data sets suggest that the algorithm achieves statistically insignificant variability when measuring the residual error, and, compared with other methods for hyper-parameter search, the proposed method produces the lowest root mean squared error in most cases. Experimental analysis suggests that the proposed approach is better suited to large-scale applications for the particular case of an LP-SVR. Moreover, due to its mathematical formulation, the proposed method can be extended to estimate any number of hyper-parameters.
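The general idea can be sketched as follows: treat a cross-validated error as a function of the hyper-parameters and minimize it with a quasi-Newton method. In this sketch, scikit-learn's epsilon-SVR stands in for the LP-SVR, and SciPy's L-BFGS-B with finite-difference gradients stands in for the paper's globalized quasi-Newton iteration with inexact first-order information; the synthetic data, log-scale parametrization, bounds, and starting point are illustrative assumptions rather than the paper's analytical initialization.

```python
# Minimal sketch: quasi-Newton search over SVR hyper-parameters.
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)

def generalization_error(theta):
    # theta holds log10 of (C, epsilon, gamma) so the search works on a
    # sensible scale; the log parametrization is an illustrative choice.
    C, eps, gamma = 10.0 ** theta
    model = SVR(C=C, epsilon=eps, gamma=gamma)
    mse = -cross_val_score(model, X, y, cv=5,
                           scoring="neg_mean_squared_error").mean()
    return np.sqrt(mse)  # cross-validated root mean squared error

theta0 = np.array([0.0, -1.0, -1.0])        # hypothetical starting point
res = minimize(generalization_error, theta0,
               method="L-BFGS-B",           # finite differences supply the
               bounds=[(-3.0, 3.0),         # (inexact) gradient estimate
                       (-4.0, 1.0),
                       (-4.0, 1.0)])
print("best (C, epsilon, gamma):", 10.0 ** res.x, "CV-RMSE:", res.fun)
```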
Similar articles
LP-SVR Model Selection Using an Inexact Globalized Quasi-Newton Strategy
In this paper we study the problem of model selection for a linear programming-based support vector machine for regression. We propose a generalized method based on a quasi-Newton approach that uses a globalization strategy and an inexact computation of first-order information. We explore the case of two-class, multi-class, and regression problems. Simulation results on standard datasets...
Quasi-Newton Methods for Nonlinear Least Squares Focusing on Curvatures
Most existing quasi-Newton methods for nonlinear least squares problems incorporate both linear and nonlinear information in the secant update. These methods exhibit good theoretical properties, but are not especially accurate in practice. The objective of this paper is to propose quasi-Newton methods that only update the nonlinearities. We show two advantages of such updates. First, fast conve...
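As a rough illustration of the structured idea (keep the exact Gauss-Newton term and apply a secant update only to the second-order "nonlinear" term), the sketch below uses a symmetric rank-one correction on a small exponential-fitting problem. The specific update formula, safeguard, and test problem are assumptions for illustration, not the scheme proposed in the cited paper.

```python
# Structured quasi-Newton sketch for nonlinear least squares:
# solve (J^T J + A) p = -J^T r, where A approximates only the
# second-order term sum_i r_i * Hess(r_i) via a secant-type update.
import numpy as np

# Small test problem: fit y = a * exp(b * t) in the least-squares sense.
t = np.linspace(0.0, 1.0, 20)
y_obs = 2.0 * np.exp(1.5 * t) + 0.01 * np.random.default_rng(0).standard_normal(t.size)

def residual(x):
    a, b = x
    return a * np.exp(b * t) - y_obs

def jacobian(x):
    a, b = x
    e = np.exp(b * t)
    return np.column_stack([e, a * t * e])

x = np.array([1.0, 1.0])      # starting guess
A = np.zeros((2, 2))          # approximation of the nonlinear (second-order) term
r, J = residual(x), jacobian(x)
for _ in range(50):
    p = np.linalg.solve(J.T @ J + A, -J.T @ r)   # structured quasi-Newton step
    x_new = x + p
    r_new, J_new = residual(x_new), jacobian(x_new)
    y_sharp = (J_new - J).T @ r_new              # structured secant target: A p = y_sharp
    v = y_sharp - A @ p
    denom = v @ p
    if abs(denom) > 1e-12 * np.linalg.norm(v) * np.linalg.norm(p):
        A = A + np.outer(v, v) / denom           # rank-one correction of A only
    x, r, J = x_new, r_new, J_new
    if np.linalg.norm(J.T @ r) < 1e-10:
        break

print("estimated (a, b):", x)
```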
Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis
We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme due to Mahdavi-Amiri and Bartels of the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...
Using an Efficient Penalty Method for Solving Linear Least Square Problem with Nonlinear Constraints
In this paper, we use a penalty method for solving the linear least squares problem with nonlinear constraints. In each iteration of penalty methods for solving the problem, the calculation of a projected Hessian matrix is required. Given that the objective function is linear least squares, the projected Hessian matrix of the penalty function consists of two parts, and the exact amount of a part of i...
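A minimal sketch of the quadratic-penalty idea for this problem class is shown below: a linear least-squares objective with one nonlinear equality constraint, minimized through a sequence of increasing penalty weights. The test problem, penalty schedule, and use of a generic BFGS solver (rather than the projected-Hessian computation discussed in the cited paper) are illustrative assumptions.

```python
# Quadratic-penalty sketch: minimize ||A x - b||^2 subject to c(x) = 0
# by minimizing ||A x - b||^2 + mu * c(x)^2 for increasing mu.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 3))
b = rng.standard_normal(30)

def c(x):
    # Nonlinear equality constraint: x must lie on the unit sphere.
    return x @ x - 1.0

def penalty(x, mu):
    r = A @ x - b                 # linear least-squares residual
    return r @ r + mu * c(x) ** 2

x = np.zeros(3)
for mu in [1.0, 10.0, 100.0, 1000.0]:
    # Warm-start each subproblem from the previous solution.
    x = minimize(lambda z: penalty(z, mu), x, method="BFGS").x

print("solution:", x, "constraint violation:", c(x))
```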
A secant method for nonlinear least-squares minimization
Quasi-Newton methods have played a prominent role, over many years, in the design of effective practical methods for the numerical solution of nonlinear minimization problems and in multi-dimensional zero-finding. There is a wide literature outlining the properties of these methods and illustrating their performance [e.g., [8]]. In addition, most modern optimization libraries house a quasi-Newt...
Journal: Int. J. Machine Learning & Cybernetics
Volume 5, Issue -
Pages -
Publication date: 2014